130 research outputs found

    Doves and hawks in economics revisited [An evolutionary quantum game theory-based analysis of financial crises]

    The last financial and economic crisis demonstrated the dysfunctional long-term effects of aggressive behaviour in financial markets. Yet, evolutionary game theory predicts that under the condition of strategic dependence a certain degree of aggressive behaviour remains within a given population of agents. However, as the consequences of the financial crisis show, it would be desirable to change the 'rules of the game' in a way that prevents aggressive behaviour from occurring, and thereby also removes the danger of market crashes. This paper picks up this aspect. By extending the well-known Hawk-Dove game with a quantum approach, we can show that, depending on entanglement, evolutionarily stable strategies can emerge which are not predicted by classical evolutionary game theory and in which the total economic population uses a non-aggressive quantum strategy.
    Keywords: evolutionary game theory; financial crisis; hawk-dove game; quantum game theory
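The classical baseline that the paper extends can be sketched numerically. Below is a minimal replicator-dynamics simulation of the ordinary (non-quantum) Hawk-Dove game; the payoff parameters V (resource value) and C (fighting cost) are illustrative assumptions, not values from the paper. It recovers the mixed equilibrium p* = V/C that classical theory predicts, i.e. a population in which some aggression always persists:

```python
# Sketch of the classical Hawk-Dove game the paper extends (quantum part omitted).
# Payoff values V (resource) and C (cost of fighting) are illustrative assumptions.

V, C = 2.0, 4.0  # C > V, so the evolutionarily stable state is a mixed population

# Payoff to the row strategy against the column strategy
payoff = {
    ("H", "H"): (V - C) / 2,  # two Hawks fight and share the expected cost
    ("H", "D"): V,            # Hawk takes the resource from a Dove
    ("D", "H"): 0.0,          # Dove retreats
    ("D", "D"): V / 2,        # two Doves share the resource
}

def hawk_fitness(p):
    """Expected payoff of a Hawk when a fraction p of the population plays Hawk."""
    return p * payoff[("H", "H")] + (1 - p) * payoff[("H", "D")]

def dove_fitness(p):
    return p * payoff[("D", "H")] + (1 - p) * payoff[("D", "D")]

# Replicator dynamics: the Hawk share grows when Hawks out-earn the population average.
p = 0.9
for _ in range(10000):
    avg = p * hawk_fitness(p) + (1 - p) * dove_fitness(p)
    p += 0.01 * p * (hawk_fitness(p) - avg)

print(round(p, 3))  # converges to the mixed ESS p* = V / C = 0.5
```

The point of the paper is that a quantum extension with sufficient entanglement can shift this equilibrium to a fully non-aggressive population, which the classical dynamics above can never do.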

    RFID’s Impact on Logistic Operations: Towards a Comprehensive Empirical Assessment

    Radio Frequency Identification (RFID) is an information technology whose appeal to practitioners and researchers remains high. Although an impressive amount of research addresses its use in various applications, its impact on logistic operations, where it is expected to yield the most significant benefits, has been poorly understood to date. In particular, rigorous empirical research along these lines is still missing and many questions remain unanswered. Which generic activities are crucial when introducing RFID in logistic processes? Are detailed planning, expertise, standards, and collaboration with business partners actually essential to achieve RFID profitability, or is RFID introduction easier than most decision makers expect? Does most of the value created by RFID projects come from improved automation, visibility, or completely new ways of doing business? To answer these questions, we have designed a survey instrument, which is described and evaluated in this paper. The hypotheses motivated in this document will soon be tested using a representative sample of RFID adopters and the structural equation modelling methodology.

    COMMUNICATION ANONYMIZERS: PERSONALITY, INTERNET PRIVACY LITERACY AND THEIR INFLUENCE ON TECHNOLOGY ACCEPTANCE

    Despite the fact that many individuals are concerned about privacy issues on the Internet and know about the existence of communication anonymizers, very few individuals actually use them. This discrepancy can only partially be explained by evident factors such as limited knowledge about Internet privacy issues, or the latency of the Internet connection caused by communication anonymizers. In this study, we determine factors that influence the acceptance of communication anonymizers: the personality traits of individuals, their stated and actual knowledge about privacy issues on the Internet, and the time an individual is willing to wait when using a communication anonymizer. Our study shows that the personality traits 'Agreeableness', 'Extroversion' and 'Conscientiousness' do not influence an individual's acceptance of communication anonymizers. Further, we show that individuals with a strong neuroticism trait are more likely to have strong privacy concerns, and that individuals who can be characterized as 'open' are more likely to use communication anonymizers. With regard to knowledge about privacy issues on the Internet, we find that individuals generally possess little knowledge. Surprisingly, we find a negative correlation between an individual's stated and his/her actual knowledge of privacy issues. Last, we find that individuals are willing to wait slightly longer (3.5 seconds) when using communication anonymizers.

    ARE YOU WILLING TO WAIT LONGER FOR INTERNET PRIVACY?

    It is becoming increasingly common for governments, service providers and specialized data aggregators to systematically collect traces of personal communication on the Internet without the user's knowledge or approval. An analysis of these personal traces by data mining algorithms can reveal sensitive personal information, such as location data, behavioral patterns, or personal profiles including preferences and dislikes. Recent studies show that this information can be used for various purposes, for example by insurance companies or banks to identify potentially risky customers, by governments to observe their citizens, and also by repressive regimes to monitor political opponents. Online anonymity software, such as Tor, can help users protect their privacy, but often comes at the price of low usability, e.g., by causing increased latency during surfing. In this exploratory study, we determine factors that influence the usage of Internet anonymity software. In particular, we show that Internet literacy, Internet privacy awareness and Internet privacy concerns are important antecedents of an Internet user's intention to use anonymity software, and that Internet patience has a positive moderating effect on the intention to use anonymity software, as well as on its perceived usefulness.

    Distributed Performance Measurement and Usability Assessment of the Tor Anonymization Network

    While the Internet increasingly permeates the everyday life of individuals around the world, it becomes crucial to prevent unauthorized collection and abuse of personalized information. Internet anonymization software such as Tor is an important instrument to protect online privacy. However, due to the performance overhead caused by Tor, many Internet users refrain from using it. This has a negative impact on the overall privacy provided by Tor, since it depends on the size of the user community and the availability of shared resources. Detailed measurements of the performance of Tor are crucial for solving this issue. This paper presents comparative experiments on Tor latency and throughput for surfing to 500 popular websites from several locations around the world over a period of 28 days. Furthermore, we compare these measurements to critical latency thresholds gathered from web usability research, including our own user studies. Our results indicate that without massive future optimizations of Tor performance, it is unlikely that a larger part of Internet users would adopt it for everyday usage. This leads to fewer resources available to the Tor community than theoretically possible, and increases the exposure of privacy-concerned individuals. Furthermore, this could create an adoption barrier for similar privacy-enhancing technologies in a Future Internet.
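The kind of comparison the paper describes, measured page-load latencies set against usability thresholds, can be sketched as follows. The sample latencies and the threshold values below are illustrative assumptions, not the paper's measurements:

```python
# Sketch: comparing measured page-load latencies against usability thresholds.
# The sample latencies and threshold values are illustrative assumptions,
# not the measurements reported in the paper.
from statistics import median

# Hypothetical per-request load times in seconds (direct vs. via Tor)
direct_s = [0.4, 0.6, 0.5, 0.8, 0.7, 0.5, 0.9, 0.6]
tor_s    = [3.1, 5.4, 4.2, 8.7, 6.3, 4.9, 12.5, 7.0]

# Assumed usability limits, in the spirit of classic web-performance guidelines
THRESHOLDS = {"tolerable": 2.0, "attention_lost": 10.0}

def share_within(samples, limit):
    """Fraction of requests completing within a usability limit."""
    return sum(t <= limit for t in samples) / len(samples)

overhead = median(tor_s) / median(direct_s)

print(f"median latency overhead via Tor: {overhead:.1f}x")
print(f"within tolerable limit via Tor: {share_within(tor_s, THRESHOLDS['tolerable']):.0%}")
print(f"within attention limit via Tor: {share_within(tor_s, THRESHOLDS['attention_lost']):.0%}")
```

With these made-up numbers, almost no Tor request stays within the tolerable threshold, which illustrates the adoption problem the paper quantifies with real measurements.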

    Privately Waiting – A Usability Analysis of the Tor Anonymity Network

    As the Internet is increasingly absorbing information from the real world, it becomes more important to prevent unauthorized collection and abuse of personalized information. At the same time, democratic societies should establish an environment helping not only their own people but also people who face repressive censorship to access public information without being identified or traced. Internet anonymization tools such as Tor offer functionalities to meet this demand. In practice, anonymization of Internet access can only be achieved by accepting higher latency, i.e., a longer waiting time before a Web site is displayed in the browser, which reduces its usability significantly. Since many users may not be willing to accept this loss of usability, they may refrain from or stop using Tor, at the same time decreasing the anonymity of other users, which depends on shared resources in the Tor user community. In this paper, we quantify the loss of usability by measuring the additional latency of the Tor software and combine our measurements with metrics from the existing Web usability and performance literature. Our findings indicate that there is still a major usability gap induced by Tor, leading to its possible disuse accompanied by a higher risk exposure of Internet users.

    The impact of ocean warming and acidification on the behaviour of two co-occurring Gadid species, Boreogadus saida and Gadus morhua from Svalbard

    Ocean acidification induces strong behavioural alterations in marine fish as a consequence of acid−base regulatory processes in response to increasing environmental CO2 partial pressure. While these changes have been investigated in tropical and temperate fish species, nothing is known about behavioural effects on polar species. In particular, fishes of the Arctic Ocean will experience much greater acidification and warming than temperate or tropical species. Also, possible interactions of ocean warming and acidification are still understudied. Here we analysed the combined effects of warming and acidification on behavioural patterns of two fish species co-occurring around Svalbard, viz. polar cod Boreogadus saida and Atlantic cod Gadus morhua. We found a significant temperature effect on the spontaneous activity of B. saida, but not of G. morhua. Environmental CO2 did not significantly influence activity of either species. In contrast, behavioural laterality of B. saida was affected by CO2 but not by temperature. Behavioural laterality of G. morhua was not affected by temperature or CO2; however, in this species, a possible temperature dependency of CO2 effects on relative laterality may have been missed due to sample size restrictions. This study indicates that fish in polar ecosystems may undergo behavioural disturbances under ocean acidification, alone and in combination with ocean warming, albeit less intense than those observed in tropical species. It further accentuates species-specific differences in vulnerability.
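Behavioural laterality in such studies is typically summarized by a relative laterality index computed from turn counts in a detour test. A minimal sketch, assuming the standard turn-count formulation (the counts below are made up, not the study's data):

```python
# Sketch of the relative laterality index commonly used in fish detour tests.
# The turn counts below are made-up illustrations, not the study's data.

def relative_laterality(right_turns, left_turns):
    """LR in [-100, 100]: positive = right-turn bias, negative = left-turn bias."""
    total = right_turns + left_turns
    return 100.0 * (right_turns - left_turns) / total

def absolute_laterality(right_turns, left_turns):
    """|LR|: strength of the side bias regardless of its direction."""
    return abs(relative_laterality(right_turns, left_turns))

# Hypothetical fish scored over 10 detour-test turns each
print(relative_laterality(8, 2))   # strong right bias: 60.0
print(relative_laterality(5, 5))   # no bias: 0.0
print(absolute_laterality(2, 8))   # same bias strength as (8, 2): 60.0
```

A CO2 effect on laterality, as reported for B. saida, would appear as a shift in the distribution of these indices between treatment groups.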

    Cerebral attenuation on single-phase CT angiography source images: Automated ischemia detection and morphologic outcome prediction after thrombectomy in patients with ischemic stroke

    Objectives: Stroke triage using CT perfusion (CTP) or MRI gained importance after successful application in recent trials on late-window thrombectomy but is often unavailable and time-consuming. We tested the clinical value of software-based analysis of cerebral attenuation on single-phase CT angiography source images (CTASI) as a CTP surrogate in stroke patients. Methods: Software-based automated segmentation and Hounsfield unit (HU) measurements for all regions of the Alberta Stroke Program Early CT Score (ASPECTS) on CTASI were performed in patients with large vessel occlusion stroke who underwent thrombectomy. To normalize values, we calculated relative HU (rHU) as the ratio of the affected to the unaffected hemisphere. Ischemic regions, regional ischemic core and final infarction were determined on simultaneously acquired CTP and follow-up imaging as ground truth. Receiver operating characteristic analysis was performed to calculate the area under the curve (AUC). Resulting cut-off values were used for comparison with visual analysis and to calculate an 11-point automated CTASI ASPECTS. Results: Seventy-nine patients were included. rHU values enabled significant classification of ischemic involvement on CTP in all ten regions of the ASPECTS (each p<0.001, except M4-cortex p = 0.002). Classification of ischemic core and prediction of final infarction yielded the best results in subcortical regions but produced lower AUC values, with significant classification for all regions except M1, M3 and M5. Relative total hemispheric attenuation showed a strong linear correlation with CTP total ischemic volume. Automated classification of regional ischemia on CTASI was significantly more accurate in most regions and provided better agreement with CTP cerebral blood flow ASPECTS than visual assessment. Conclusions: Automated attenuation measurements on CTASI provide excellent performance in detecting acute ischemia as identified on CTP, with improved accuracy compared to visual analysis. However, their value for approximating the ischemic core and morphologic outcome in large vessel occlusion stroke after thrombectomy was regionally dependent and limited. This technique has the potential to facilitate stroke imaging as a sensitive surrogate for CTP-based ischemia detection.
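The normalization and classification steps described above can be sketched as follows. The rHU values, labels, and the minimal rank-based AUC implementation are illustrative assumptions, not the study's software or data:

```python
# Sketch of rHU normalization and ROC-AUC evaluation as described in the abstract.
# All numbers are made-up illustrations, not the study's data.

def relative_hu(affected_hu, unaffected_hu):
    """rHU: attenuation of a region on the affected hemisphere, normalized
    by the mirror region on the unaffected hemisphere."""
    return affected_hu / unaffected_hu

def roc_auc(scores, labels):
    """Minimal ROC AUC via the rank-sum (Mann-Whitney U) formulation:
    the probability that a random positive outscores a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical rHU values: ischemic regions tend to show lower ratios (< 1.0)
rhu      = [0.82, 0.88, 0.96, 0.97, 0.99, 1.00, 1.01, 0.95]
ischemic = [1,    1,    1,    0,    0,    0,    0,    0]

# Lower rHU indicates ischemia, so score by the negated ratio
print(roc_auc([-x for x in rhu], ischemic))
```

A cut-off chosen on such a curve is what the study compares against visual assessment and uses to build the automated CTASI ASPECTS.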

    Performance of Automated Attenuation Measurements at Identifying Large Vessel Occlusion Stroke on CT Angiography

    PURPOSE Computed tomography angiography (CTA) is routinely used to detect large-vessel occlusion (LVO) in patients with suspected acute ischemic stroke; however, visual analysis is time consuming and prone to error. To evaluate solutions to support imaging triage, we tested the performance of automated analysis of CTA source images (CTASI) at identifying patients with LVO. METHODS Stroke patients with LVO were selected from a prospectively acquired cohort. A control group was selected from consecutive patients with clinically suspected stroke without signs of ischemia on CT perfusion (CTP) or infarct on follow-up. Software-based automated segmentation and Hounsfield unit (HU) measurements were performed on CTASI for all regions of the Alberta Stroke Program Early CT score (ASPECTS). We derived different parameters from the raw measurements and analyzed their performance at identifying patients with LVO using receiver operating characteristic curve analysis. RESULTS The retrospective analysis included 145 patients, 79 patients with LVO stroke and 66 patients without stroke. The parameters hemispheric asymmetry ratio (AR), ratio between highest and lowest regional AR, and M2-territory AR produced area under the curve (AUC) values from 0.95-0.97 (all p < 0.001) for detecting the presence of LVO in the total population. Resulting sensitivity (sens)/specificity (spec) defined by the Youden index were 0.87/0.97-0.99. Maximum sens/spec defined by the specificity threshold ≥0.70 were 0.91-0.96/0.77-0.83. Performance in a small number of patients with isolated M2 occlusion was lower (AUC: 0.72-0.85). CONCLUSION Automated attenuation measurements on CTASI identify proximal LVO stroke patients with high sensitivity and specificity. This technique can aid in accurate and timely patient selection for thrombectomy, especially in primary stroke centers without CTP capacity.
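Cut-off selection via the Youden index, as used in the study above, can be sketched as follows. The asymmetry-ratio values and labels are made-up illustrations, not the study's data:

```python
# Sketch of cut-off selection via the Youden index (J = sensitivity + specificity - 1).
# The asymmetry-ratio values below are made-up illustrations, not the study's data.

def youden_cutoff(scores, labels):
    """Return (cutoff, sensitivity, specificity) maximizing J.
    A higher score is assumed to indicate LVO (label 1)."""
    best = (None, 0.0, 0.0, -1.0)
    for cut in sorted(set(scores)):
        tp = sum(s >= cut and y == 1 for s, y in zip(scores, labels))
        fn = sum(s < cut and y == 1 for s, y in zip(scores, labels))
        tn = sum(s < cut and y == 0 for s, y in zip(scores, labels))
        fp = sum(s >= cut and y == 0 for s, y in zip(scores, labels))
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if j > best[3]:
            best = (cut, sens, spec, j)
    cut, sens, spec, _ = best
    return cut, sens, spec

# Hypothetical hemispheric asymmetry ratios (higher = more asymmetric = LVO)
asym   = [1.02, 1.03, 1.05, 1.12, 1.15, 1.18, 1.21, 1.04]
is_lvo = [0,    0,    0,    1,    1,    1,    1,    0]

print(youden_cutoff(asym, is_lvo))
```

The study additionally reports sensitivity at a fixed specificity floor (≥0.70), a common alternative when one error type matters more than the other.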